Consider a nonlinear regression model $y_i = g(x_i, \theta) + e_i$, $i = 1, \ldots, n$, where the $x_i$ are random predictors and $\theta$ is the unknown parameter vector ranging in a set $\Theta \subset R^p$. All known results on the consistency of the least squares estimator, and in general of M estimators, assume that either $\Theta$ is compact or $g$ is bounded, which excludes frequently employed models such as the Michaelis-Menten, logistic growth and exponential decay models. In this article we deal with the so-called separable models, where $p = p_1 + p_2$ and $\theta = (\alpha, \beta)$ with $\alpha \in A \subset R^{p_1}$, $\beta \in B \subset R^{p_2}$, and $g$ has the form $g(x, \theta) = \beta^T h(x, \alpha)$, where $h$ is a function with values in $R^{p_2}$. We prove the strong consistency of M estimators under very general assumptions, assuming that $h$ is a bounded function of $\alpha$, which includes the three models mentioned above.

Key words and phrases: Nonlinear regression, separable models, consistency, robust estimation.
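As a minimal sketch (not taken from the paper) of what the separable structure buys in practice, consider the exponential decay model $y = \beta e^{-\alpha x} + e$, which is separable with $h(x, \alpha) = e^{-\alpha x}$, a bounded function of $\alpha$ for $x \ge 0$. For each fixed $\alpha$, the optimal $\beta$ has a closed-form linear least squares solution, so estimation reduces to a low-dimensional search over $\alpha$. All names below (`profile_sse`, the grid bounds, the simulated data) are illustrative assumptions:

```python
import numpy as np

# Simulate data from an exponential decay model: y = beta * exp(-alpha * x) + e.
# This model is separable: g(x, theta) = beta * h(x, alpha) with
# h(x, alpha) = exp(-alpha * x), which is bounded in alpha for x >= 0.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0.0, 5.0, n)
alpha_true, beta_true = 0.8, 2.5
y = beta_true * np.exp(-alpha_true * x) + 0.05 * rng.normal(size=n)

def profile_sse(alpha):
    """Residual sum of squares after profiling out beta for a fixed alpha."""
    h = np.exp(-alpha * x)            # h(x, alpha), the bounded basis function
    beta = (h @ y) / (h @ h)          # closed-form linear LS solution for beta
    r = y - beta * h
    return r @ r, beta

# Crude grid search over alpha (a stand-in for a proper 1-D optimizer);
# the separable form means only alpha needs a nonlinear search.
grid = np.linspace(0.1, 2.0, 400)
sse, betas = zip(*(profile_sse(a) for a in grid))
k = int(np.argmin(sse))
alpha_hat, beta_hat = grid[k], betas[k]
print(alpha_hat, beta_hat)
```

This "profiling" of the linear parameter is the same device that underlies variable projection methods for separable least squares; the paper's consistency results concern the statistical behavior of such estimators, not this particular algorithm.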